A sparse matrix approach to reverse mode automatic differentiation in Matlab

Authors

  • Shaun A. Forth
  • Naveen Kr. Sharma
Abstract

We review the extended Jacobian approach to automatic differentiation of a user-supplied function and highlight the forward and reverse variants of its Schur complement form. We detail a Matlab operator-overloading approach that constructs the extended Jacobian and enables the function Jacobian to be computed using Matlab's sparse matrix operations. Memory and runtime costs are reduced using a variant of the hoisting technique of Bischof (Issues in Parallel Automatic Differentiation, 1991). On five of the six mesh-based gradient test problems from the MINPACK-2 Test Problem Collection (Averick et al., 1992), the reverse variant of our extended Jacobian technique with hoisting outperforms the sparse-storage forward mode of the MAD package (Forth, ACM Trans. Math. Software 32, 2006). For increasing problem size, the ratio of gradient to function CPU time is seen to be bounded, if not decreasing, in line with Griewank and Walther's (Evaluating Derivatives, SIAM, 2008) cheap gradient principle.
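For concreteness, the following minimal Matlab sketch (our own illustration; the block names B, L, T, R are our notation and need not match the paper's) builds the extended Jacobian blocks by hand for the toy function y = sin(x1)*x2 + x1. Writing the linearized trace as dz = B*dx + L*dz and dy = R*dx + T*dz with L strictly lower triangular, the function Jacobian is the Schur complement J = R + T*(I - L)^{-1}*B, computable by sparse triangular solves:

    % Function trace for y = sin(x1)*x2 + x1 at a sample point
    x  = [0.5; 2.0];
    v3 = sin(x(1)); v4 = v3*x(2); y = v4 + x(1);

    p = 3; n = 2; m = 1;                           % counts: intermediates, inputs, outputs
    B = sparse([cos(x(1)) 0; 0 v3; 1 0]);          % partials of intermediates w.r.t. inputs
    L = sparse(p, p); L(2,1) = x(2); L(3,2) = 1;   % strictly lower: intermediate couplings
    T = sparse([0 0 1]);                           % partials of outputs w.r.t. intermediates
    R = sparse(m, n);                              % direct output-input partials (none here)

    E = speye(p);
    Jfwd = R + T * ((E - L) \ B);                  % forward variant: solve against B's columns
    Jrev = R + (T / (E - L)) * B;                  % reverse variant: solve against T's rows
    % Both equal [1 + x2*cos(x1), sin(x1)]

The forward variant solves against the n input columns of B, while the reverse variant solves against the m output rows of T, which is why the reverse variant is attractive for gradients, where m = 1.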

Similar resources

High-level Interfaces for the Mad (matlab Automatic Differentiation) Package

Presently, the MAD Automatic Differentiation package for MATLAB comprises an overloaded implementation of forward mode AD via the fmad class. A key design feature of the fmad class is a separation of the storage and manipulation of directional derivatives into a separate derivvec class. Within the derivvec class, directional derivatives are stored as matrices (2-D arrays) allowing for the use o...
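As a rough illustration of this design, the sketch below (an assumed mock-up, not the actual fmad/derivvec API) overloads a few Matlab operations on a class whose dv property holds a 1-by-ndd row of directional derivatives, so that derivative propagation reduces to array arithmetic:

    % Save as admock.m -- a hypothetical stand-in, not the real fmad/derivvec classes
    classdef admock
        properties
            v    % value
            dv   % 1-by-ndd array of directional derivatives
        end
        methods
            function a = admock(v, dv)
                a.v = v; a.dv = dv;
            end
            function c = plus(a, b)
                c = admock(a.v + b.v, a.dv + b.dv);
            end
            function c = mtimes(a, b)
                c = admock(a.v*b.v, a.v*b.dv + b.v*a.dv);  % product rule
            end
            function c = sin(a)
                c = admock(sin(a.v), cos(a.v)*a.dv);       % chain rule
            end
        end
    end

    % Usage: seed two inputs with unit directional derivatives
    x1 = admock(0.5, [1 0]); x2 = admock(2.0, [0 1]);
    y  = sin(x1)*x2 + x1;   % y.dv is the gradient [1 + x2*cos(x1), sin(x1)]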

An Overview of High Order Reverse Mode

Automatic Differentiation (AD) is increasingly an important component of Machine Learning (ML) packages. For evaluating the gradient, the first-order reverse mode, also known as back-propagation, is optimal and widely used. However, the functionalities of current mainstream ML packages for evaluating second- and higher-order derivatives are limited. One reason is that high order derivatives ar...
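To make the back-propagation idea concrete, here is a minimal Matlab sketch (our own illustration, unrelated to any package above) for y = sin(x1)*x2 + x1: a forward sweep evaluates and records the intermediates, and a reverse sweep propagates adjoints from the output back to the inputs at a cost independent of the number of inputs:

    x  = [0.5; 2.0];
    v3 = sin(x(1)); v4 = v3*x(2); y = v4 + x(1);   % forward sweep

    ybar  = 1;                          % seed the output adjoint
    v4bar = ybar;                       % from y  = v4 + x1
    x1bar = ybar;
    v3bar = v4bar * x(2);               % from v4 = v3 * x2
    x2bar = v4bar * v3;
    x1bar = x1bar + v3bar * cos(x(1));  % from v3 = sin(x1)
    grad  = [x1bar, x2bar];             % equals [1 + x2*cos(x1), sin(x1)]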

AMOR REPORT 2005/01 Source Transformation for MATLAB Automatic Differentiation

This report describes MSAD, a tool that applies source transformation automatic differentiation to MATLAB programs involving arbitrary vector-valued functions. The transformed programs compute both the results of the original program and the first derivatives. The current version of MSAD performs a complete source transformation for the forward mode of AD by specialising and inlining operations...
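As a hedged illustration of what such a forward-mode source transformation produces (the g_* naming here is hypothetical, not MSAD's actual output convention), the single statement y = x .* sin(x) might be rewritten so that each assignment is paired with a statement propagating its derivative:

    x   = linspace(0, 1, 5);        % example input
    g_x = ones(size(x));            % seed: dx/dx
    % transformed form of  y = x .* sin(x)
    t1   = sin(x);
    g_t1 = cos(x) .* g_x;
    y    = x .* t1;
    g_y  = g_x .* t1 + x .* g_t1;   % product rule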

ADMAT: Automatic differentiation in MATLAB using object oriented methods

Differentiation is one of the fundamental problems in numerical mathematics. The solution of many optimization problems and other applications requires knowledge of the gradient, the Jacobian matrix, or the Hessian matrix of a given function. Automatic differentiation (AD) is a powerful emerging technology for computing derivatives accurately and fast. ADMAT (Automatic Differentiation for M...

Forward-Mode Automatic Differentiation in Julia

We present ForwardDiff, a Julia package for forward-mode automatic differentiation (AD) featuring performance competitive with low-level languages like C++. Unlike recently developed AD tools in other popular high-level languages such as Python and MATLAB, ForwardDiff takes advantage of just-in-time (JIT) compilation to transparently recompile AD-unaware user code, enabling efficient support fo...


Journal:

Volume:   Issue:

Pages: -

Publication date: 2010